In this short notebook we provide several interactive demonstrations of the fundamental role mathematical optimization plays in regression.
Every learning problem has parameters that must be tuned properly to ensure optimal learning. For example, in the case of linear regression (with one-dimensional input) there are two parameters that must be properly tuned: the slope and intercept of the linear model. These two parameters are tuned by forming a cost function - a continuous function of both parameters - that measures how well the linear model fits a dataset given a value for its slope and intercept. Geometrically, the proper tuning of these parameters via the cost function corresponds to finding the values of the parameters that make the cost function as small as possible - in other words, to minimizing the cost function. In the image below you can see how choosing a set of parameters higher on the cost function results in a corresponding linear fit that is poorer than the one corresponding to parameters at the lowest point on the cost surface.
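As a rough sketch of this idea, the cell below evaluates the Least Squares cost of a line at two different parameter settings. The small dataset here is made up purely for illustration (it stands in for the notebook's CSV file), and `least_squares_cost` is a hypothetical helper, not part of the demo backend.

```python
import numpy as np

# made-up toy dataset for illustration only; it lies exactly on y = 2x + 1
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 5.0, 7.0])

def least_squares_cost(b, w, x, y):
    """Average squared error of the line y = b + w*x over the data."""
    residuals = (b + w * x) - y
    return np.mean(residuals ** 2)

# the true parameters sit at the bottom of the cost surface...
print(least_squares_cost(1.0, 2.0, x, y))   # 0.0
# ...while parameters far from them sit higher up and fit the data poorly
print(least_squares_cost(0.0, 0.0, x, y))   # 21.0
```

Sweeping `b` and `w` over a grid of values and plotting the resulting costs produces exactly the kind of surface and contour plot shown in the panels below.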

In the first demonstration we plot a toy dataset (left panel), its corresponding Least Squares cost function surface (middle panel), and the contours of this function (right panel). After activating the Python cell below, these panels will appear. Using your mouse, click on any point in the right panel, which shows the contours of the cost function. Any point clicked in this panel will be highlighted on the cost surface in the middle panel, and the line generated by this choice of parameters will be drawn on the data in the left panel.
Notice how points chosen on the lower part of the cost function provide a superior fit to the data, while points chosen higher up - on the outer rings of the contour plot - provide a very poor fit.
# add backend files - this needs to be run before any of the demos
import sys
sys.path.append('../../../demo_python_backend_files')
# import statements
%matplotlib nbagg
from Regression_Demo1 import Regression_Demo1
# call the demonstration
csvname = '../../../demo_datasets/toy_linear_regression_data.csv'
demo = Regression_Demo1(csvname)
Here we take the same dataset shown in the previous demo and run gradient descent to find the global minimum of the Least Squares cost function. Using gradient descent we move gradually lower and lower on the cost function, eventually reaching its minimum, which corresponds to the proper choice of parameters for this fit.
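A minimal sketch of gradient descent on the Least Squares cost is given below. The dataset and the `grad_descent` helper are hypothetical stand-ins (the demo's own backend and CSV file are not reproduced here); the step length `alpha` and iteration count are illustrative choices.

```python
import numpy as np

# made-up toy dataset for illustration only; it lies exactly on y = 2x + 1
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 5.0, 7.0])

def grad_descent(x, y, b=0.0, w=0.0, alpha=0.1, max_its=200):
    """Minimize the cost (1/N) * sum_n (b + w*x_n - y_n)^2 by gradient descent."""
    N = len(x)
    for _ in range(max_its):
        r = b + w * x - y                       # residuals at current parameters
        b -= alpha * (2.0 / N) * np.sum(r)      # step along -d(cost)/db
        w -= alpha * (2.0 / N) * np.sum(r * x)  # step along -d(cost)/dw
    return b, w

b, w = grad_descent(x, y)
print(b, w)  # close to the true intercept 1 and slope 2
```

Each iteration takes a step in the direction of steepest descent of the cost surface, which is why the parameter trajectory in the demo slides downhill toward the minimum.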
# import statements
%matplotlib inline
import ml_optimization_1dim_sliders as demo
demo = demo.ml_optimization_1dim_sliders()
# call the demonstration
csvname = '../../../demo_datasets/toy_linear_regression_data.csv'
demo.load_data(csvname)
# run gradient descent
demo.run_lin_regression_grad_descent(inits = [-2.5,-2.5],max_its = 60)
# run slider
demo.fitting_slider(xlabel = 'intercept', ylabel= 'slope', view = [30,70], fit_type = 'line fit')
Here we demonstrate how to fit a sine wave to a noisy periodic dataset. Using gradient descent we again move gradually lower and lower on the cost function, eventually reaching its minimum, which corresponds to the proper choice of parameters for this fit.
Even though we are fitting a nonlinear sinusoid to this dataset, for technical reasons (the model is linear in its parameters - see Chapter 3 of the text for further details) this too is called a linear fit, or a linear regression.
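To illustrate why such a sine fit still counts as linear regression, the sketch below uses the trigonometric identity a*sin(x + phi) = a*cos(phi)*sin(x) + a*sin(phi)*cos(x) to rewrite the model in a form linear in its parameters, and then solves it directly with ordinary least squares. The noiseless data, amplitude, and phase here are made up for illustration; the notebook's actual demo uses gradient descent on its own CSV dataset instead.

```python
import numpy as np

# hypothetical noiseless data from a sine wave with known amplitude and phase
x = np.linspace(0, 2 * np.pi, 50)
y = 1.5 * np.sin(x + 0.7)

# the model y = w1*sin(x) + w2*cos(x) is linear in w1 and w2, so ordinary
# least squares applies even though the fit itself is a nonlinear curve
A = np.column_stack([np.sin(x), np.cos(x)])
w1, w2 = np.linalg.lstsq(A, y, rcond=None)[0]

# recover the amplitude and phase from the linear parameters
amplitude = np.hypot(w1, w2)      # a = sqrt(w1^2 + w2^2)
phase = np.arctan2(w2, w1)        # phi = atan2(w2, w1)
print(amplitude, phase)           # close to 1.5 and 0.7
```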
# import statements
%matplotlib inline
import ml_optimization_1dim_sliders as demo
demo = demo.ml_optimization_1dim_sliders()
# call the demonstration
csvname = '../../../demo_datasets/toy_sinusoidal_regression_data.csv'
demo.load_data(csvname,'sin')
# run gradient descent
demo.run_lin_regression_grad_descent(inits = [-2.5,2.5],max_its = 30)
# run slider
demo.fitting_slider(xlabel = 'phase', ylabel= 'amplitude',view = [60,20], fit_type = 'sine fit')
The content of this notebook is supplementary material for the textbook Machine Learning Refined (Cambridge University Press, 2016). Visit http://mlrefined.com for free chapter downloads and tutorials, and our Amazon site for details regarding a hard copy of the text.